The following content has been provided by the University of Erlangen-Nürnberg.
Alright, welcome to the last lecture in Biomedical Signal Analysis for this semester.
You are invited to come back next winter if you like it.
I hope everybody will pass the exam and doesn't have to come back.
So, yeah, let's see.
A little bit of an appetizer or recap regarding last lecture.
So we talked about pattern recognition systems, machine learning basics, and I introduced
the pipeline for machine learning and pattern recognition.
So what does it look like?
Where does it all start?
And you can always think like, okay, where did we start the lecture?
This wasn't designed that way, but I like the synergies and the parallels.
So what do we start with?
We started with a problem, observations.
So we started with some kind of biomedical signals.
And they exist in the world, and we need to somehow capture them for our automatic system.
So we start with sensing, sensors.
So we talked about the generation of the signals and the sensors in our second chapter
because it's important to understand the properties of your signals and the properties of the
sensing systems in order to design a very good system.
What's the next step in Heinrich Niemann's definition of a pattern recognition system?
Pre-processing.
And you can think of pre-processing as filtering, as wavelet-based event detection,
and, for example, as processes to understand your signal even better
and to enhance it for subsequent processing.
That's why it's called pre-processing.
What's the next step?
I heard it already, but I don't know who said it.
Feature extraction, features.
And the last step is automatic decision making or classification.
So assigning an object, like a measurement of a QRS complex, to one or several classes,
which might be healthy heartbeat or pathologic heartbeat.
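The four stages just described (sensing, pre-processing, feature extraction, classification) can be sketched end to end. This is a minimal, hypothetical illustration: the moving-average filter, the feature choices, and the threshold are all placeholders, not the methods from the lecture.

```python
import numpy as np

def preprocess(signal, kernel_size=5):
    """Pre-processing: a simple moving-average filter to suppress noise (illustrative)."""
    kernel = np.ones(kernel_size) / kernel_size
    return np.convolve(signal, kernel, mode="same")

def extract_features(beat):
    """Feature extraction: reduce a beat segment to a small feature vector."""
    return np.array([beat.max() - beat.min(),   # amplitude range
                     beat.mean(),               # baseline level
                     beat.std()])               # variability

def classify(features, threshold=1.0):
    """Classification: assign the object to one of two classes (toy rule)."""
    return "healthy" if features[0] > threshold else "pathologic"

# Sensing is simulated here by a synthetic signal with one spike.
signal = np.zeros(100)
signal[50] = 3.0
filtered = preprocess(signal)
features = extract_features(filtered)
print(classify(features))
```

Each function stands in for one pipeline stage; in a real system each stage would be far more elaborate, but the data flow from raw signal to class label is the same.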
We talked a lot about classification, and we talked a little bit about Bayes' theorem.
We talked about the importance of prior probabilities and joint probability density functions.
So this is a joint probability density function.
Actually, no, this is not a joint, it's a conditional probability density function.
Sorry for that.
So how do you get these probability density functions in reality?
So p of x given omega i, the density of the observation x conditioned on class omega i.
You need to get numbers for that.
You need to get the mean and variability of this distribution, for example.
How can you get that in practice?
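One common answer, sketched here under the assumption that the class-conditional density is Gaussian: collect labeled samples from that class and compute the maximum-likelihood estimates of mean and variance. The data below are synthetic and purely illustrative.

```python
import numpy as np

# Synthetic labeled samples standing in for measurements from one class omega_i
# (e.g. some feature of healthy heartbeats) -- illustrative values only.
rng = np.random.default_rng(0)
samples = rng.normal(loc=0.8, scale=0.1, size=500)

mu_hat = samples.mean()          # ML estimate of the mean
sigma_hat = samples.std(ddof=0)  # ML estimate of the standard deviation

def gaussian_pdf(x, mu, sigma):
    """Evaluate the fitted class-conditional density p(x | omega_i)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

print(mu_hat, sigma_hat, gaussian_pdf(0.8, mu_hat, sigma_hat))
```

With enough labeled data, the estimated parameters approach the true ones, and the fitted density can then be plugged into Bayes' theorem.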
Knowledge about what?
Yeah, and if I ask you for a specific problem: give me a number for P of omega i, say P of omega 1 and P of omega 2.
So the prior probability of observing a certain class.
How do you give me the number?
OK, so you have a data set, and you count, basically. You measure.
You do an experiment. You measure. You have some kind of data, and you count the relative frequencies.
That gives you an estimate of the probability.
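Counting relative frequencies in a labeled data set is exactly how the prior probabilities P(omega i) are estimated in practice. A minimal sketch with made-up labels:

```python
from collections import Counter

# Hypothetical labeled data set: 90 healthy and 10 pathologic beats.
labels = ["healthy"] * 90 + ["pathologic"] * 10

# Prior estimate P(omega_i) = (count of class i) / (total number of samples)
counts = Counter(labels)
priors = {cls: n / len(labels) for cls, n in counts.items()}
print(priors)  # {'healthy': 0.9, 'pathologic': 0.1}
```

So with these made-up counts, P(omega 1) = 0.9 for healthy and P(omega 2) = 0.1 for pathologic, and the estimates sum to one by construction.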
Access: open access
Duration: 01:11:08 min
Recording date: 2018-02-08
Uploaded: 2018-02-09 07:07:35
Language: de-DE